4 research outputs found

    Investigating and Mitigating Failure Modes in Physics-informed Neural Networks (PINNs)

    This paper explores the difficulties in solving partial differential equations (PDEs) using physics-informed neural networks (PINNs). PINNs use physics as a regularization term in the objective function. However, a drawback of this approach is the requirement for manual hyperparameter tuning, making it impractical in the absence of validation data or prior knowledge of the solution. Our investigations of the loss landscapes and backpropagated gradients in the presence of physics reveal that existing methods produce non-convex loss landscapes that are hard to navigate. Our findings demonstrate that high-order PDEs contaminate backpropagated gradients and hinder convergence. To address these challenges, we introduce a novel method that bypasses the calculation of high-order derivative operators and mitigates the contamination of backpropagated gradients. Consequently, we reduce the dimension of the search space and make learning PDEs with non-smooth solutions feasible. Our method also provides a mechanism to focus on complex regions of the domain. In addition, we present a dual unconstrained formulation based on the Lagrange multiplier method to enforce equality constraints on the model's prediction, with adaptive and independent learning rates inspired by adaptive subgradient methods. We apply our approach to solve various linear and non-linear PDEs.
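    To make the Lagrange-multiplier idea above concrete, the following is a minimal sketch (not the authors' code) for a 1D Poisson problem, where Dirichlet boundary conditions are enforced as equality constraints and the multipliers are updated by dual ascent with Adagrad-style adaptive, per-constraint learning rates; the network size, step sizes, and the omission of a quadratic penalty are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): a 1D Poisson problem
# -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0. Dirichlet conditions are
# enforced via Lagrange multipliers updated by dual ascent with Adagrad-style
# adaptive, per-constraint learning rates. All hyperparameters are illustrative.
import torch

torch.manual_seed(0)
model = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

x_int = torch.linspace(0.0, 1.0, 64).reshape(-1, 1).requires_grad_(True)
x_bnd = torch.tensor([[0.0], [1.0]])                     # boundary points
u_bnd = torch.zeros(2, 1)                                # Dirichlet data
f = lambda x: (torch.pi ** 2) * torch.sin(torch.pi * x)  # manufactured source term

lam = torch.zeros(2, 1)      # one multiplier per equality constraint
grad_sq = torch.zeros(2, 1)  # running sum of squared constraint violations
eta, eps = 1e-2, 1e-8        # dual step size and numerical safeguard
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(5000):
    opt.zero_grad()
    u = model(x_int)
    du = torch.autograd.grad(u, x_int, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x_int, torch.ones_like(du), create_graph=True)[0]
    pde_res = -d2u - f(x_int)                # PDE residual at collocation points
    c = model(x_bnd) - u_bnd                 # equality-constraint violations
    # Lagrangian: PDE loss plus lambda^T c (quadratic penalty omitted for brevity).
    loss = pde_res.pow(2).mean() + (lam * c).sum()
    loss.backward()
    opt.step()
    # Dual ascent with an independent, Adagrad-style learning rate per constraint.
    with torch.no_grad():
        c_val = model(x_bnd) - u_bnd
        grad_sq += c_val ** 2
        lam += eta / (grad_sq.sqrt() + eps) * c_val
```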

    A Generalized Schwarz-type Non-overlapping Domain Decomposition Method using Physics-constrained Neural Networks

    We present a meshless Schwarz-type non-overlapping domain decomposition method based on artificial neural networks for solving forward and inverse problems involving partial differential equations (PDEs). To ensure the consistency of solutions across neighboring subdomains, we adopt a generalized Robin-type interface condition, assigning a unique Robin parameter to each subdomain. These subdomain-specific Robin parameters are learned to minimize the mismatch in the Robin interface condition, facilitating efficient information exchange during training. Our method is applicable to both the Laplace and Helmholtz equations. It represents the local solution on each subdomain by an independent neural network model that is trained to minimize the loss on the governing PDE while strictly enforcing boundary and interface conditions through an augmented Lagrangian formalism. A key strength of our method lies in its ability to learn a Robin parameter for each subdomain, thereby enhancing information exchange with its neighboring subdomains. We observe that the learned Robin parameters adapt to the local behavior of the solution, the domain partitioning, and the location of each subdomain relative to the overall domain. Extensive experiments on forward and inverse problems, including one-way and two-way decompositions with cross-points, demonstrate the versatility and performance of our proposed approach.
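    As an illustration of the generalized Robin-type interface condition described above, the sketch below assumes a 1D domain split at x = 0.5 into two non-overlapping subdomains, each with its own network and learnable Robin parameter, and computes the interface mismatch term that would be added to each subdomain's loss; all names and hyperparameters are hypothetical.

```python
# Illustrative sketch of a generalized Robin interface term for a two-subdomain,
# non-overlapping split of a 1D problem at x = 0.5. Each subdomain owns a network
# and a learnable Robin parameter; in practice alpha_a and alpha_b would be added
# to the optimizer so they are trained to reduce the interface mismatch.
import torch

net_a = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
net_b = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
alpha_a = torch.nn.Parameter(torch.tensor(1.0))  # Robin parameter of subdomain A
alpha_b = torch.nn.Parameter(torch.tensor(1.0))  # Robin parameter of subdomain B

def u_and_du(net, x):
    """Evaluate a subdomain network and its spatial derivative at x."""
    x = x.clone().requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    return u, du

def robin_interface_loss(x_gamma):
    """Mismatch of the generalized Robin condition at the interface x_gamma.

    Subdomain A requires alpha_a * u + du/dn (with n pointing from A into B),
    computed from its own solution, to match the same quantity computed from B's
    solution; subdomain B imposes the symmetric condition with its own alpha_b
    and the opposite normal direction.
    """
    u_a, du_a = u_and_du(net_a, x_gamma)
    u_b, du_b = u_and_du(net_b, x_gamma)
    res_a = (alpha_a * u_a + du_a) - (alpha_a * u_b + du_b)   # A's Robin mismatch
    res_b = (alpha_b * u_b - du_b) - (alpha_b * u_a - du_a)   # B's Robin mismatch
    return res_a.pow(2).mean() + res_b.pow(2).mean()

x_gamma = torch.tensor([[0.5]])             # shared interface point
loss_gamma = robin_interface_loss(x_gamma)  # added to each subdomain's PDE loss
```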

    Scientific Machine Learning for Transport Phenomena in Thermal and Fluid Sciences

    Physics-informed neural networks (PINNs) have become popular as part of the rapidly expanding deep learning field in recent years. However, their origins date back to the early 1990s, when neural networks were adopted as meshless numerical methods to solve partial differential equations (PDEs). PINNs incorporate equations of known physics into the objective function as a regularization term, necessitating hyperparameter tuning to ensure convergence. The lack of a validation dataset or a priori knowledge of the solution can make PINNs impractical. Moreover, learning inverse PDE problems with noisy data can be difficult, since it can lead to overfitting the noise or underfitting the high-fidelity data. To overcome these obstacles, this dissertation introduces physics and equality constrained artificial neural networks (PECANNs) as a deep learning framework for forward and inverse PDE problems. The backbone of this framework is a constrained optimization formulation that embeds the governing equations along with any available data in a principled fashion using an adaptive augmented Lagrangian method. Additionally, the framework is extended to learn the solution of large-scale PDE problems through a novel Schwarz-type domain decomposition method with a generalized Robin-type interface condition. The efficacy and versatility of the PECANN approach are demonstrated by solving several challenging forward and inverse PDE problems that arise in the thermal and fluid sciences.
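    The adaptive augmented Lagrangian idea at the core of the PECANN framework can be sketched generically as below; `pde_residual` and `constraint` are placeholder callables for a specific problem's physics loss and boundary/data constraints, and the penalty-update schedule is a standard safeguarded ALM heuristic rather than the dissertation's exact rule.

```python
# Generic sketch of an adaptive augmented Lagrangian training loop of the kind
# described above. `pde_residual(model)` and `constraint(model)` are placeholder
# callables returning residual tensors for a specific problem; the penalty-update
# rule is a common safeguarded ALM heuristic, not necessarily the exact scheme
# used in the dissertation.
import torch

def train_alm(model, pde_residual, constraint, outer_steps=20, inner_steps=500):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    lam = torch.zeros_like(constraint(model)).detach()  # one multiplier per constraint
    mu = 1.0                                            # penalty parameter
    prev_viol = float("inf")
    for _ in range(outer_steps):
        # Inner loop: minimize the augmented Lagrangian over the network weights.
        for _ in range(inner_steps):
            opt.zero_grad()
            c = constraint(model)
            loss = (pde_residual(model).pow(2).mean()
                    + (lam * c).sum() + 0.5 * mu * c.pow(2).sum())
            loss.backward()
            opt.step()
        # Outer step: first-order multiplier update, then adapt the penalty.
        with torch.no_grad():
            c = constraint(model)
            lam += mu * c
            viol = c.abs().max().item()
            if viol > 0.25 * prev_viol:    # violation not shrinking fast enough
                mu = min(10.0 * mu, 1e6)   # increase the penalty (capped)
            prev_viol = viol
    return model
```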

    Physics and equality constrained artificial neural networks: Application to forward and inverse problems with multi-fidelity data fusion

    Physics-informed neural networks (PINNs) have been proposed to learn the solution of partial differential equations (PDEs). In PINNs, the residual form of the PDE of interest and its boundary conditions are lumped into a composite objective function as soft penalties. Here, we show that this specific way of formulating the objective function is the source of severe limitations in the PINN approach when applied to different kinds of PDEs. To address these limitations, we propose a versatile framework based on a constrained optimization problem formulation, in which we use the augmented Lagrangian method (ALM) to constrain the solution of a PDE with its boundary conditions and any high-fidelity data that may be available. Our approach is well suited to forward and inverse problems with multi-fidelity data fusion. We demonstrate the efficacy and versatility of our physics- and equality-constrained deep-learning framework by applying it to several forward and inverse problems involving multi-dimensional PDEs. Our framework achieves orders-of-magnitude improvements in accuracy compared with state-of-the-art physics-informed neural networks.
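    To contrast the two formulations discussed above, here is a minimal, hypothetical sketch of (i) the classical soft-penalty PINN objective and (ii) an augmented-Lagrangian objective in which boundary conditions and high-fidelity data are treated as equality constraints while low-fidelity data remain in the objective; all residual and constraint tensors are placeholders, not the authors' code.

```python
# Hypothetical sketch contrasting the two formulations. r_pde, r_bc, r_data,
# r_lofi, c_bc, and c_hifi are placeholder residual/constraint tensors
# produced elsewhere for a specific problem.
import torch

def pinn_soft_penalty_loss(r_pde, r_bc, r_data, w_bc=1.0, w_data=1.0):
    """Classical composite PINN objective: every term is a weighted soft penalty."""
    return (r_pde.pow(2).mean()
            + w_bc * r_bc.pow(2).mean()
            + w_data * r_data.pow(2).mean())

def constrained_alm_loss(r_pde, r_lofi, c_bc, c_hifi, lam_bc, lam_hifi, mu):
    """Augmented Lagrangian: the PDE residual and low-fidelity misfit form the
    objective, while boundary conditions and high-fidelity data are enforced as
    equality constraints via multipliers and a quadratic penalty."""
    objective = r_pde.pow(2).mean() + r_lofi.pow(2).mean()
    penalty = ((lam_bc * c_bc).sum() + 0.5 * mu * c_bc.pow(2).sum()
               + (lam_hifi * c_hifi).sum() + 0.5 * mu * c_hifi.pow(2).sum())
    return objective + penalty

# Example usage with random placeholder tensors (shapes are arbitrary).
r = {k: torch.randn(8, 1) for k in ("pde", "bc", "data", "lofi", "c_bc", "c_hifi")}
soft = pinn_soft_penalty_loss(r["pde"], r["bc"], r["data"])
alm = constrained_alm_loss(r["pde"], r["lofi"], r["c_bc"], r["c_hifi"],
                           lam_bc=torch.zeros(8, 1), lam_hifi=torch.zeros(8, 1), mu=10.0)
```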